A wide range of Azure data can be ingested into Splunk, including usage information and resource logs (HTTP, Console, Platform, etc.). By ingesting these logs, resource data can be explored alongside usage data to identify the reasons behind unexpected costs. Additionally, in an organisational environment, viewing this data in Splunk allows dashboards to be created and tailored for specific departments or people, ensuring they only have access to data relevant to their role.
In these articles, we will use the Splunk Add-on for Microsoft Cloud Services, which provides all the features necessary to ingest and search Azure data.
Add-on Installation
How you install the Splunk Add-on for Microsoft Cloud Services will differ depending on your Splunk environment. Search Heads are the only instances that require the add-on, as they need its Microsoft Cloud Services knowledge management. Universal Forwarders are the only instances the add-on does not support, because it requires Python and the Splunk REST handler, which Universal Forwarders do not provide.
Single-Instanced Splunk Enterprise
See installation article.
Distributed Splunk Enterprise
See installation article; remember, this add-on is not compatible with Universal Forwarders.
Add-on Setup
Depending on the data we want to ingest, some prerequisite setup is required before the add-on can use the relevant APIs.
Data to Ingest | Required Component |
---|---|
Azure Resource | Microsoft Entra Application |
Azure Audit | Microsoft Entra Application |
Azure Event Hub | Microsoft Entra Application |
Azure Metrics | Microsoft Entra Application |
Azure KQL Log Analytics | Microsoft Entra Application |
Azure Consumption (Billing) | Microsoft Entra Application |
Microsoft Entra Application
Required for the following resources:
- Azure Resource
- Azure Audit
- Azure Event Hub
- Azure Metrics
- Azure KQL Log Analytics
- Azure Consumption (Billing)
The add-on obtains data from the above resources using the Windows Azure Service Management APIs. To access these APIs, we must create a Microsoft Entra Application (previously called an Azure Active Directory Application) and then assign it a role so it has the correct permissions to access our resources.
Note – The `Application.ReadWrite.All` permission is required by the user registering the application and assigning it a role.
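Under the hood, the add-on authenticates to these APIs with the application's credentials via the Microsoft identity platform's client-credentials flow. As a rough illustration (the placeholder IDs and the Azure Resource Manager scope are assumptions for this sketch, not values from this article), the token request it performs looks something like this:

```python
# Sketch of the OAuth2 client-credentials token request made with an
# Entra application's credentials. The IDs below are placeholders.

def token_request(tenant_id: str, client_id: str, client_secret: str):
    """Build the token endpoint URL and form body for the
    Microsoft identity platform client-credentials flow."""
    url = f"https://login.microsoftonline.com/{tenant_id}/oauth2/v2.0/token"
    body = {
        "grant_type": "client_credentials",
        "client_id": client_id,
        "client_secret": client_secret,
        # Scope for the Azure Resource Manager APIs (an assumption here)
        "scope": "https://management.azure.com/.default",
    }
    return url, body

url, body = token_request("<tenant-id>", "<client-id>", "<client-secret>")
# POSTing this form body to the URL (e.g. requests.post(url, data=body))
# would return an access token for the management APIs.
```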
Creating the Application
- Sign in to the Microsoft Entra Admin Centre.
- Using the left navigation bar, head to Identity > Applications > App registrations then click New registration.
- Give the application a name; in this example, we used `splunk-demo`.
We now have an Entra Application; next, we must assign it a role before we can use it.
Assigning the Application a Role
- Go to the Azure Portal. We should still be logged in, but if not, log in.
- Head to Home > Subscriptions.
- Select the subscription we want to use from the Subscription Name column.
- Select Access control (IAM).
- Select Add, then Add role assignment.
- Within the Role tab, select the Reader role.
a. The Reader role covers most resources, but to read event hubs, we must also assign the `Azure Event Hubs Data Receiver` role.
- Select Next.
- Within the Members tab, select Assign access to, then select User, group, or service principal.
- Select Select members. By default, Microsoft Entra applications aren’t displayed in the available options. To find our application, search for it by its name; in our case, `splunk-demo`.
- Select the Select button, then select Review + assign.
Now that our application has the correct role, we must obtain its credentials.
Obtaining Application Credentials
- Return to the Microsoft Entra Admin Centre. We should still be logged in, but if not, log in.
- Using the left navigation bar, head to Identity > Applications > App registrations, then click on your application from the Display name column.
- Within the Essentials dropdown, take note of:
a. Application (client) ID – known as our Client ID.
b. Directory (tenant) ID – known as our Tenant ID.
- Select Certificates & secrets.
- Select Client secrets, then + New client secret.
- Provide a description and a duration.
- Select Add.
- Note down the Value – known as our Key or Client Secret.
a. Important – Note this down before leaving the page. This value is only displayed once.
b. We can safely ignore Secret ID.
Our application credentials (known in the add-on as our account attributes) should be as follows:
Account Attributes
Attribute | Name in Splunk Web | Description |
---|---|---|
account_stanza_name | Name | Enter a friendly name for your Azure app account. Account name cannot contain any whitespace. |
client_id | Client ID | Listed as Application (client) ID on Azure |
client_secret | Key (Client Secret) | Listed as Value on Azure |
tenant_id | Tenant ID | Listed as Directory (tenant) ID on Azure |
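The one constraint the table places on the stanza name is worth checking up front. A minimal sketch of that rule (the helper name is ours, not part of the add-on):

```python
# Minimal sketch of the account-name rule from the table above:
# the account name cannot contain any whitespace.

def valid_account_name(name: str) -> bool:
    """True if the name is non-empty and contains no whitespace."""
    return bool(name) and not any(ch.isspace() for ch in name)

print(valid_account_name("splunk-demo"))   # True
print(valid_account_name("splunk demo"))   # False
```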
Linking our Application to Splunk
We can now connect our application to Splunk using the credentials obtained above. This can be done through Splunk Web or Splunk’s configuration files.
Splunk Web
- Launch the add-on, then select Configuration.
- Select Azure App Account > Add Azure App Account.
- Enter a friendly Name for the account.
- Enter the Client ID, Key (Client Secret) and Tenant ID using the Account Attributes table above.
- Select Add.
Configuration Files
- Create or open `$SPLUNK_HOME/etc/apps/Splunk_TA_microsoft-cloudservices/local/mscs_azure_accounts.conf`.
- Add the following stanza using the Account Attributes table above:

```
[<account_stanza_name>]
client_id = <value>
client_secret = <value>
tenant_id = <value>
```
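If you generate this file rather than edit it by hand, the stanza can be rendered with Python's standard configparser, whose INI format matches Splunk's .conf files. A minimal sketch (the `render_account_stanza` helper is hypothetical, and the values are placeholders):

```python
# Hypothetical helper that renders the account stanza shown above.
# Key names match the account attributes table; values are placeholders.
import configparser
import io

def render_account_stanza(name, client_id, client_secret, tenant_id) -> str:
    cfg = configparser.ConfigParser()
    cfg[name] = {
        "client_id": client_id,
        "client_secret": client_secret,
        "tenant_id": tenant_id,
    }
    buf = io.StringIO()
    cfg.write(buf)  # emits "key = value" lines under "[name]"
    return buf.getvalue()

print(render_account_stanza("splunk-demo", "<client-id>", "<secret>", "<tenant-id>"))
```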
Ingesting Data
Now that the add-on is installed and configured, we can begin ingesting our data into Splunk. For the purposes of this article, the add-on has been installed on a single Splunk instance. We will cover Azure Consumption (Billing) and Event Hubs.
Azure Consumption (Billing)
Here are the fields available for Azure Consumption and what they mean:
- Name – A unique name for our input
- Azure App Account – Select the Azure Application we made earlier
- Subscription ID – The subscription we want to monitor. We can find this ID in the Azure Portal under Home > Subscriptions > (select our subscription from the Subscription name column) > Subscription ID.
- Data Type – Usage Details to collect usage details data or Reservation Recommendation to collect reservation recommendation data
- Interval – How often (in seconds) to read usage data; the default is 86400.
- Index – The index our data will be stored in
- Sourcetype – The sourcetype the ingested data will use; the default is `mscs:consumption:billing` for Usage Details and `mscs:consumption:reservation:recommendation` for Reservation Recommendation.
- Query Days – Specify the maximum number of days to query; the default is 10.
- Start Date – Select a start date to specify how far back to go when initially collecting data; the default is 90. When the Usage Details data type is selected, the start date is used to calculate the Usage Details API query date range. The end date is the start date plus the number of days specified by Query Days. For example, if the start date is 2022-01-01T00:00:00 and Query Days is 10, the end date is 2022-01-11T00:00:00.
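The Start Date / Query Days arithmetic above can be sketched directly (the `query_end_date` helper is ours, purely for illustration):

```python
# Sketch of the query date range arithmetic described above: the end of
# the Usage Details query range is the start date plus Query Days days.
from datetime import datetime, timedelta

def query_end_date(start_date: str, query_days: int = 10) -> str:
    """Return the end of the query range in the same format as the input."""
    fmt = "%Y-%m-%dT%H:%M:%S"
    start = datetime.strptime(start_date, fmt)
    return (start + timedelta(days=query_days)).strftime(fmt)

# Matches the worked example above:
print(query_end_date("2022-01-01T00:00:00", 10))  # 2022-01-11T00:00:00
```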
Splunk Web
- Launch the add-on, then select Inputs.
- Select Create New Input > Azure Consumption (Billing).
- Fill in the fields with the above data. See the image below as an example.
- Select Add.
Configuration Files
- In your Splunk platform deployment, navigate to `$SPLUNK_HOME/etc/apps/Splunk_TA_microsoft-cloudservices/local`.
- Create a file named `inputs.conf`, if it does not already exist.
- Add the following stanzas for the consumption input:
- Input configuration for the Usage Details data type:
```
[mscs_azure_consumption://<input_stanza_name>]
account = <value>
data_type = Usage Details
index = <value>
interval = 86400
query_days = <value>
sourcetype = mscs:consumption:billing
start_date = <value>
subscription_id = <value>
```
- Input configuration for the Reservation Recommendation data type:
```
[mscs_azure_consumption://<input_stanza_name>]
account = <value>
data_type = Reservation Recommendation
index = <value>
interval = 86400
sourcetype = mscs:consumption:reservation:recommendation
subscription_id = <value>
```
- Save and restart the Splunk platform.
Azure Event Hubs
Configure your inputs using Splunk Web on the Splunk platform instance responsible for collecting data for this add-on, usually a heavy forwarder; in our case, we are using a single Splunk instance.
Here are the fields available for Azure Event Hubs and what they mean:
- Name – A unique name for our input.
- Azure App Account – Select the Azure Application we made earlier.
- Event Hub Namespace (FQDN) – Known as the Event Hub Namespace’s Host name, for example, `azure-to-splunk-demo.servicebus.windows.net`. We can find this in the Azure Portal under Home > Event Hubs > (select our Event Hub Namespace from the Name column) > Host name.
- Event Hub Name – The name of the Event Hub within the Event Hub Namespace.
- Consumer Group – The Azure Event Hub Consumer Group.
- Max Wait Time – The maximum interval in seconds the event processor will wait before processing. The default is 300 seconds.
- Max Batch Size – The maximum number of events to retrieve in one batch. The default is 300.
- Transport Type – AMQP over Websocket or AMQP.
- Index – The index the data will be stored in.
- Sourcetype – The sourcetype the ingested data will use; the default is `mscs:azure:eventhub`.
- Interval – The number of seconds to wait before the Splunk platform reruns the command. The default is 3600 seconds.
- Enable Blob Checkpoint Store – Enable a storage blob as the checkpoint store for the event hub input. Enabling this requires two additional fields:
- Azure Storage Account – The Azure Storage account in which the container is created to store event hub checkpoints.
- Container Name – Enter the container name under the storage account. You can only add one container name for each input.
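To make the Max Batch Size and Max Wait Time semantics concrete, here is an illustrative sketch (not the add-on's actual code): a batch is handed off once it reaches `max_batch_size` events, or once `max_wait_time` seconds have elapsed since the batch was started.

```python
# Illustrative batching sketch: flush a batch when it reaches
# max_batch_size events or when max_wait_time seconds have elapsed.
import time

def collect_batches(events, max_batch_size=300, max_wait_time=300.0,
                    clock=time.monotonic):
    batches, batch, started = [], [], clock()
    for event in events:
        batch.append(event)
        if len(batch) >= max_batch_size or clock() - started >= max_wait_time:
            batches.append(batch)          # size or time limit reached
            batch, started = [], clock()   # start a fresh batch
    if batch:
        batches.append(batch)              # flush the remainder
    return batches

# With max_batch_size=2, five events yield batches of 2, 2 and 1.
print([len(b) for b in collect_batches(range(5), max_batch_size=2)])
```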
Splunk Web
- Launch the add-on, then select Inputs.
- Select Create New Input > Azure Event Hub.
- Fill in the fields with the above data. See the image below as an example.
- Select Add.
Configuration Files
- In your Splunk platform deployment, navigate to `$SPLUNK_HOME/etc/apps/Splunk_TA_microsoft-cloudservices/local`.
- Create a file named `inputs.conf`, if it does not already exist.
- Add the following stanza for the Event Hub input:
```
[mscs_azure_event_hub://<input_stanza_name>]
account = <value>
blob_checkpoint_enabled = <value>
storage_account = <value>
container_name = <value>
consumer_group = <value>
event_hub_name = <value>
event_hub_namespace = <value>
index = <value>
interval = <value>
max_batch_size = <value>
max_wait_time = <value>
use_amqp_over_websocket = 1
sourcetype = mscs:azure:eventhub
```
- Save and restart the Splunk platform.